- The new head of the NTSB called Tesla's use of the term Full Self-Driving "misleading and irresponsible," the Wall Street Journal reported.
- Elon Musk has admitted FSD is "not great" and has room to improve.
- The moniker has been criticized by regulators who say it can make drivers think the cars are fully autonomous when that is not the case.
A top US safety regulator says Tesla needs to address some major safety concerns, and she criticized as "irresponsible" the company's use of the term Full Self-Driving for its driver-assistance technology, according to a Wall Street Journal report published Sunday.
Tesla CEO Elon Musk announced last week that Tesla drivers can soon expect a newer version of FSD, an enhanced version of Autopilot, the driver-assistance software that comes with every Tesla vehicle. FSD doesn't make the car fully autonomous, but it does allow the vehicle to change lanes, park itself, and recognize traffic lights and stop signs.
However, Jennifer Homendy, the new head of the National Transportation Safety Board, told the Journal in an interview that the upcoming release is premature.
"Basic safety issues have to be addressed before they're then expanding it to other city streets and other areas," Homendy told the Journal.
Homendy called the electric car maker's use of the term Full Self-Driving "misleading and irresponsible." That moniker, as well as the name Autopilot, has been criticized by regulators and lawmakers who say such names can make drivers think the cars are fully autonomous when that is not the case.
"It has clearly misled numerous people to misuse and abuse technology," she told the Journal.
Last month, Musk said FSD software was "not great" and had room for improvement. While Musk said the newest and unreleased version of the technology was "much improved," in July he advised in a tweet that drivers should be "paranoid."
Tesla did not respond to Insider's request to comment on the next FSD update.
Experts advise that drivers learn the software's limitations before they get behind the wheel. Homendy also said that those with regulatory and enforcement power, which the NTSB does not have, should be aggressively working to regulate driver-assistance technology for consumer safety.
US safety regulators launched an investigation into Autopilot after a number of Teslas struck vehicles at first-responder scenes. On Thursday, a driver was arrested on suspicion of drunken driving after she crashed into a Southern California freeway wall while her car was on Autopilot, Insider reported.
The US National Highway Traffic Safety Administration (NHTSA) earlier this year opened an investigation into Autopilot's role in 30 crashes that killed 10 people. NHTSA has already ruled out the Autopilot system in three of those crashes.